Watch: Drone footage shows scale of one illegal waste dump

BBC News

Hundreds of illegal dumps are operating across England, including at least 11 so-called super sites containing tens of thousands of tonnes of rubbish, a BBC investigation has found. Drone footage showed one of the waste dumps in Over, Gloucestershire. Most sites are in countryside locations, often hidden, and on what should be agricultural land. Police say many are run by organised crime gangs, who make money by charging far less than legitimate operators to take and bury waste.


Rubrik's Cube: Testing a New Rubric for Evaluating Explanations on the CUBE dataset

Galvan-Sosa, Diana, Gaudeau, Gabrielle, Kavumba, Pride, Li, Yunmeng, Gu, Hongyi, Yuan, Zheng, Sakaguchi, Keisuke, Buttery, Paula

arXiv.org Artificial Intelligence

The performance and usability of Large Language Models (LLMs) are driving their use in explanation generation tasks. However, despite their widespread adoption, LLM explanations have been found to be unreliable, making it difficult for users to distinguish good from bad explanations. To address this issue, we present Rubrik's CUBE, an education-inspired rubric and a dataset of 26k explanations, written by both humans and six open- and closed-source LLMs and later quality-annotated using the rubric. The CUBE dataset focuses on two reasoning and two language tasks, providing the diversity necessary to effectively test our proposed rubric. Using Rubrik, we find that explanations are influenced by both task and perceived difficulty. Low quality stems primarily from a lack of conciseness in LLM-generated explanations, rather than from cohesion or word choice. The full dataset, rubric, and code will be made available upon acceptance.


'Orwellian' Surveillance Cameras Face Legal Battle

Forbes - Tech

Civil liberties group Big Brother Watch has launched a legal challenge against the use of automatic facial recognition technology by London's Metropolitan Police force. The privacy campaigners described the Met's "China-style" facial recognition system, which uses AI software to match people's faces to a criminal database, as "dangerously authoritarian." "Facial recognition is the latest Orwellian mass surveillance tool to be lawlessly rolled out by the state," Big Brother Watch writes on the campaign website. "These real-time facial recognition cameras are biometric checkpoints, identifying members of the public without their knowledge. Police have begun feeding secret watchlists to the cameras, containing not only criminals but suspects, protesters, football fans and innocent people with mental health problems."


Police facial recognition system faces legal challenge

BBC News

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic facial recognition (AFR) uses CCTV or surveillance cameras to record facial characteristics and compare them with images on police databases. Lawyers for Big Brother Watch argue that the use of AFR breaches individuals' rights under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology. However, it has proved controversial, with one watchdog describing its use in public places as "very intrusive".